Device, method and system for determining flight height of unmanned aerial vehicle
Patent abstract:
The present disclosure relates to a device, a method and a system for determining the flight height of an unmanned aerial vehicle. The determining device comprises: a camera at the bottom of the unmanned aerial vehicle is carried on a three-axis self-stabilizing tripod head; a carrier phase difference satellite positioning system, a graphics processing computer and a power supply system are all provided on the top of the unmanned aerial vehicle; the graphics processing computer is connected with the carrier phase difference satellite positioning system, the power supply system and the camera, respectively; an attitude and heading reference system is provided at the bottom of the three-axis self-stabilizing tripod head and is connected with the graphics processing computer; the graphics processing computer is configured to determine the relative height of the unmanned aerial vehicle from the canopy of farmland surface crops according to the position information acquired by the carrier phase difference satellite positioning system, the attitude information acquired by the attitude and heading reference system and the ground orthographic image acquired by the camera, and to determine the flight height of the unmanned aerial vehicle according to the relative height. The present disclosure improves the accuracy of flight height determination and the stability of the acquired data.
Publication number: NL2027359A
Application number: NL2027359
Filing date: 2021-01-21
Publication date: 2021-03-25
Inventors: He Yong; He Liwen; Chu Bingquan
Applicant: Univ Zhejiang
IPC main class:
Patent description:
DEVICE, METHOD AND SYSTEM FOR DETERMINING FLIGHT HEIGHT OF UNMANNED AERIAL VEHICLE
TECHNICAL FIELD
The present disclosure relates to the field of unmanned aerial vehicles, in particular to a device, a method and a system for determining the flight height of an unmanned aerial vehicle.
BACKGROUND
When an unmanned aerial vehicle is used to acquire information on agricultural and forestry crops, the optical imaging task device imposes requirements on focal length and focus, so the unmanned aerial vehicle needs to keep a relatively stable distance from the photographed crops as much as possible during flight. If a stable relative altitude cannot be maintained, the images will often be out of focus, leading to blurred images from which no usable information can be extracted. When operating in plain areas, especially in leveled standard farmland, the unmanned aerial vehicle can fly at a fixed altitude. In terraced fields or hilly areas, however, the unmanned aerial vehicle needs to follow the terrain according to terrain data or readings from other airborne devices. At present, the accuracy of commonly used geographic information data is about 1 meter, and the height of the crop canopy is not represented in such data. Common airborne devices such as laser rangefinders and ultrasonic rangefinders measure several points and average them to calculate the relative height. This approach has many problems: because of interference from canopy density, soil moisture, air humidity and solar radiation, the relative height between the unmanned aerial vehicle and the crop canopy cannot be monitored stably.
SUMMARY
The object of the present disclosure is to provide a device, a method and a system for determining the flight height of an unmanned aerial vehicle, which can stably monitor the relative height between the unmanned aerial vehicle and the crop canopy, thereby improving the accuracy of flight height determination and the stability of the acquired data.
To achieve the above object, the present disclosure provides the following scheme. A device for determining the flight height of an unmanned aerial vehicle comprises an unmanned aerial vehicle, a three-axis self-stabilizing tripod head, a carrier phase difference satellite positioning system, an attitude and heading reference system, a graphics processing computer and a power supply system, wherein: a camera at the bottom of the unmanned aerial vehicle is carried on the three-axis self-stabilizing tripod head, and the three-axis self-stabilizing tripod head is used to keep the optical axis of the camera stable; the carrier phase difference satellite positioning system, the graphics processing computer and the power supply system are all provided on the top of the unmanned aerial vehicle; the graphics processing computer is connected with the carrier phase difference satellite positioning system, the power supply system and the camera, respectively; the attitude and heading reference system is provided at the bottom of the three-axis self-stabilizing tripod head and is connected with the graphics processing computer; the graphics processing computer is configured to determine the relative height of the unmanned aerial vehicle from the canopy of farmland surface crops according to the position information acquired by the carrier phase difference satellite positioning system, the attitude information acquired by the attitude and heading reference system and the ground orthographic image acquired by the camera, and to determine the flight height of the unmanned aerial vehicle according to the relative height. Preferably, the camera is a high-resolution visible light camera.
The present disclosure relates to a method for determining the flight height of an unmanned aerial vehicle, wherein the determining method is applied to the device for determining the flight height of an unmanned aerial vehicle, and the determining method comprises: acquiring a plurality of ground orthographic images in an agricultural operation area and the position data and attitude data of an unmanned aerial vehicle at the time of acquiring the plurality of ground orthographic images; constructing a Gaussian difference pyramid according to pixel points in any two adjacent ground orthographic images; determining the feature points in any two adjacent ground orthographic images according to the Gaussian difference pyramid; matching the feature points in two adjacent ground orthographic images to determine a matching feature point pair; determining the position of the matching feature point pair in space according to the matching feature point pair; determining the position of the unmanned aerial vehicle in space according to the position data and attitude data of the unmanned aerial vehicle; determining the relative height of the unmanned aerial vehicle from the canopy of farmland surface crops according to the position of the unmanned aerial vehicle in space and the position of the matching feature point pair in space; and determining the flight height of the unmanned aerial vehicle according to the relative height. Preferably, prior to constructing a Gaussian difference pyramid according to pixel points in any two adjacent ground orthographic images, the method further comprises: calibrating pixel points in the ground orthographic image by using the formulas

x_corrected = x(1 + k1*r^2 + k2*r^4 + k3*r^6)
y_corrected = y(1 + k1*r^2 + k2*r^4 + k3*r^6)
x_corrected = x + [2*p1*x*y + p2*(r^2 + 2*x^2)]
y_corrected = y + [p1*(r^2 + 2*y^2) + 2*p2*x*y]
where k1, k2 and k3 are radial distortion factors, p1 and p2 are tangential distortion factors, x and y are the pixel point coordinates, x_corrected and y_corrected are the coordinates of the calibrated pixel point, and r is the distance from the pixel point to the image center point. Preferably, matching the feature points in two adjacent ground orthographic images to determine a matching feature point pair specifically comprises: constructing a feature description vector corresponding to each feature point according to the feature points in the ground orthographic image; determining the Euclidean distance between the feature points in one ground orthographic image and the feature points in another ground orthographic image according to the feature description vector corresponding to each feature point; and taking the feature point pair whose Euclidean distance is less than the distance threshold as the matching feature point pair, wherein the matching feature point pair comprises two feature points located in different ground orthographic images.
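As an illustration, the pixel calibration formulas above can be sketched in Python. This is a minimal sketch under the assumption that the radial and tangential corrections are applied together, as in the standard Brown distortion model; the function name and all coefficient values used below are illustrative only.

```python
import numpy as np

def undistort_point(x, y, k1, k2, k3, p1, p2):
    """Apply the radial and tangential distortion correction described
    above to one normalized pixel coordinate (x, y)."""
    r2 = x * x + y * y  # squared distance r^2 from the image center
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    # Radial term scales the coordinate; tangential term is added on top.
    x_corr = x * radial + (2 * p1 * x * y + p2 * (r2 + 2 * x * x))
    y_corr = y * radial + (p1 * (r2 + 2 * y * y) + 2 * p2 * x * y)
    return x_corr, y_corr
```

With all distortion factors set to zero the point is returned unchanged, which is a quick sanity check on the implementation.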
A system for determining the flight height of an unmanned aerial vehicle comprises: a data acquiring module configured to acquire a plurality of ground orthographic images in an agricultural operation area and the position data and attitude data of an unmanned aerial vehicle at the time of acquiring the plurality of ground orthographic images; a Gaussian difference pyramid constructing module configured to construct a Gaussian difference pyramid according to pixel points in any two adjacent ground orthographic images; a feature point determining module configured to determine the feature points in any two adjacent ground orthographic images according to the Gaussian difference pyramid; a matching feature point pair determining module configured to match the feature points in two adjacent ground orthographic images to determine a matching feature point pair; a position determining module of a matching feature point pair in space configured to determine the position of the matching feature point pair in space according to the matching feature point pair; a position determining module of an unmanned aerial vehicle in space configured to determine the position of the unmanned aerial vehicle in space according to the position data and attitude data of the unmanned aerial vehicle; a relative height determining module configured to determine the relative height of the unmanned aerial vehicle from the canopy of farmland surface crops according to the position of the unmanned aerial vehicle in space and the position of the matching feature point pair in space; and a flight height determining module configured to determine the flight height of the unmanned aerial vehicle according to the relative height.
Preferably, the system further comprises: a pixel point calibrating module configured to calibrate pixel points in the ground orthographic image by using the formulas

x_corrected = x(1 + k1*r^2 + k2*r^4 + k3*r^6)
y_corrected = y(1 + k1*r^2 + k2*r^4 + k3*r^6)
x_corrected = x + [2*p1*x*y + p2*(r^2 + 2*x^2)]
y_corrected = y + [p1*(r^2 + 2*y^2) + 2*p2*x*y]

where k1, k2 and k3 are radial distortion factors, p1 and p2 are tangential distortion factors, x and y are the pixel point coordinates, x_corrected and y_corrected are the coordinates of the calibrated pixel point, and r is the distance from the pixel point to the image center point. Preferably, the matching feature point pair determining module specifically comprises: a feature description vector constructing unit configured to construct a feature description vector corresponding to each feature point according to the feature points in the ground orthographic image; a Euclidean distance determining unit configured to determine the Euclidean distance between the feature points in one ground orthographic image and the feature points in another ground orthographic image according to the feature description vector corresponding to each feature point; and a matching feature point pair determining unit configured to take the feature point pair whose Euclidean distance is less than the distance threshold as the matching feature point pair, wherein the matching feature point pair comprises two feature points located in different ground orthographic images.
According to the specific embodiments provided by the present disclosure, the present disclosure discloses the following technical effects. According to the device, method and system for determining the flight height of an unmanned aerial vehicle, the longitude and latitude coordinates of the unmanned aerial vehicle, the attitude information of the camera and the ground orthographic image are acquired in real time during the flight of the unmanned aerial vehicle, the relative height between the unmanned aerial vehicle and the canopy of farmland surface crops is calculated, and the flight height of the unmanned aerial vehicle is determined according to the relative height. Under the guidance of the relative height, when the unmanned aerial vehicle performs terrain-following flight, the flight altitude can be objectively controlled according to the crop canopy, so as to achieve accuracy of the acquired information and stability of the data.
BRIEF DESCRIPTION OF THE DRAWINGS
In order to explain the embodiments of the present disclosure or the technical scheme in the prior art more clearly, the drawings needed in the embodiments will be briefly introduced hereinafter. Obviously, the drawings in the following description are only some embodiments of the present disclosure. For those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a structural schematic diagram of a device for determining the flight height of an unmanned aerial vehicle according to the present disclosure;
FIG. 2 is a flow chart of a method for determining the flight height of an unmanned aerial vehicle according to the present disclosure;
FIG. 3 is a schematic diagram of a black-and-white checkerboard calibration board according to the present disclosure;
FIG. 4 is a structural schematic diagram of a system for determining the flight height of an unmanned aerial vehicle according to the present disclosure.
DETAILED DESCRIPTION
The technical scheme in the embodiments of the present disclosure will be described clearly and completely with reference to the drawings in the embodiments of the present disclosure hereinafter. Obviously, the described embodiments are only some embodiments of the present disclosure, rather than all of the embodiments. Based on the embodiments of the present disclosure, all other embodiments obtained by those skilled in the art without creative effort belong to the scope of protection of the present disclosure. The present disclosure aims to provide a device, a method and a system for determining the flight height of an unmanned aerial vehicle. In order to make the above objects, features and advantages of the present disclosure more obvious and understandable, the present disclosure will be further explained in detail with reference to the drawings and specific embodiments hereinafter.
FIG. 1 is a structural schematic diagram of a device for determining the flight height of an unmanned aerial vehicle according to the present disclosure. As shown in FIG. 1, the device for determining the flight height of an unmanned aerial vehicle according to the present disclosure comprises an unmanned aerial vehicle 1, a three-axis self-stabilizing tripod head 2, a carrier phase difference satellite positioning system 3, an attitude and heading reference system 4, a graphics processing computer 5 and a power supply system 6. The camera 7 at the bottom of the unmanned aerial vehicle 1 is carried on the three-axis self-stabilizing tripod head 2, and the three-axis self-stabilizing tripod head 2 is used to keep the optical axis of the camera 7 stable. The camera 7 is a high-resolution visible light camera. The carrier phase difference satellite positioning system 3, the graphics processing computer 5 and the power supply system 6 are all provided on the top of the unmanned aerial vehicle 1.
The graphics processing computer 5 is connected with the carrier phase difference satellite positioning system 3, the power supply system 6 and the camera 7, respectively. The attitude and heading reference system 4 is provided at the bottom of the three-axis self-stabilizing tripod head 2 and is connected with the graphics processing computer 5. The graphics processing computer 5 is configured to determine the relative height of the unmanned aerial vehicle 1 from the canopy of farmland surface crops according to the position information acquired by the carrier phase difference satellite positioning system 3, the attitude information acquired by the attitude and heading reference system 4 and the ground orthographic image acquired by the camera 7, and to determine the flight height of the unmanned aerial vehicle according to the relative height. The specific working process of the device for determining the flight height of the unmanned aerial vehicle is as follows. When the unmanned aerial vehicle 1 flies in an agricultural operation area with complex terrain, the graphics processing computer 5 continuously sends out trigger instructions according to the currently measured longitude and latitude coordinate data, controls the high-resolution visible light camera 7 carried on the three-axis self-stabilizing tripod head 2 to continuously acquire a plurality of images in a vertical downward posture, and controls the carrier phase difference satellite positioning system 3 and the attitude and heading reference system 4 to acquire the current position information and attitude information of the high-resolution visible light camera 7. The above information is transmitted to the high-performance graphics processing computer 5 through the data line for processing. FIG. 2 is a flow chart of a method for determining the flight height of an unmanned aerial vehicle according to the present disclosure. As shown in FIG.
2, the method for determining the flight height of an unmanned aerial vehicle according to the present disclosure is applied to the device for determining the flight height of an unmanned aerial vehicle described above, and the determining method comprises the following steps.
S201, a plurality of ground orthographic images in an agricultural operation area and the position data and attitude data of an unmanned aerial vehicle at the time of acquiring the plurality of ground orthographic images are acquired. Before using the camera, the internal parameters of the visible light camera are calibrated by using a black-and-white checkerboard with an interval of 25 mm, as shown in FIG. 3. More than 8 images of the calibration board at different angles need to be acquired and input into the OpenCV runtime library, which automatically calculates the calibration parameters using the complete camera calibration toolbox.
S202, a Gaussian difference pyramid is constructed according to pixel points in any two adjacent ground orthographic images. Prior to S202, the method further comprises: calibrating pixel points in the ground orthographic image by using the formulas

x_corrected = x(1 + k1*r^2 + k2*r^4 + k3*r^6)
y_corrected = y(1 + k1*r^2 + k2*r^4 + k3*r^6)
x_corrected = x + [2*p1*x*y + p2*(r^2 + 2*x^2)]
y_corrected = y + [p1*(r^2 + 2*y^2) + 2*p2*x*y]

where k1, k2 and k3 are radial distortion factors, p1 and p2 are tangential distortion factors, x and y are the pixel point coordinates, x_corrected and y_corrected are the coordinates of the calibrated pixel point, and r is the distance from the pixel point to the image center point. The specific process of constructing a Gaussian difference pyramid is as follows. The Gaussian pyramid is divided into O groups, and each group is divided into S layers. The resolution of the images in each group is the same. As the pyramid height increases, the image becomes more blurred. The number of layers S of the pyramid is four.
The resolution of the original image determines the number of groups of the Gaussian pyramid. The calculating formula is as follows:

O = floor( log2( min(X, Y) ) ) - 2

where X and Y represent the length and width of the original image, and floor(.) means rounding down. To construct the Gaussian difference pyramid, it is necessary to blur and smooth the original image and generate the images layer by layer, thus forming the LOG (Laplacian of Gaussian) scale space L(x, y, sigma). It is calculated by convolution of the Gaussian function G(x, y, sigma) and the image I(x, y). The formula is

L(x, y, sigma) = G(x, y, sigma) (*) I(x, y)

where x and y represent the coordinate values of the pixel points on the horizontal axis and the vertical axis of the image, (*) represents convolution, and G(x, y, sigma) is calculated as follows:

G(x, y, sigma) = (1 / (2 * pi * sigma^2)) * e^( -(x^2 + y^2) / (2 * sigma^2) )

where sigma is the scale space factor; the larger the value of sigma, the larger the image processing range and the smoother the image. Assume that the scale space factor of the original image is 0.5; that is, the image scale sigma(0, 0) of group 0 and layer 0 of the Gaussian pyramid is 0.5, so that the scale space factor of group p and layer q is calculated as follows:

sigma(p, q) = 2^(p + q/S) * sigma(0, 0)

The LOG scale space L(x, y, sigma) can be calculated by combining the above formulas. The DOG (Difference of Gaussians) pyramid can then be calculated by taking the difference between adjacent lower and upper images in each group. For example, subtracting the image of group p, layer q from the image of group p, layer q+1 in LOG space gives the image of group p, layer q of the DOG pyramid. After the DOG pyramid is constructed, each pixel point is judged against its neighboring pixel points in the current scale space and the adjacent scale spaces to determine whether it is an extreme point, which gives the positions of the feature points.
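The pyramid construction described above can be sketched in Python with numpy only. This is a hedged sketch: the per-layer scale sigma0 * 2^(q/S) within each octave and the downsampling by taking every other pixel are assumed readings of the description, and the function names are illustrative.

```python
import numpy as np

def _gaussian_blur(img, sigma):
    """Separable Gaussian blur implemented with numpy only."""
    radius = max(1, int(3 * sigma))
    xs = np.arange(-radius, radius + 1)
    kernel = np.exp(-xs ** 2 / (2.0 * sigma ** 2))
    kernel /= kernel.sum()
    # Pad with edge values, then convolve rows and columns in turn.
    padded = np.pad(img, radius, mode='edge')
    rows = np.apply_along_axis(np.convolve, 1, padded, kernel, 'valid')
    return np.apply_along_axis(np.convolve, 0, rows, kernel, 'valid')

def build_dog_pyramid(image, S=4, sigma0=0.5):
    """DOG pyramid per the description above: O groups with
    O = floor(log2(min(X, Y))) - 2 and S = 4 difference layers per group."""
    X, Y = image.shape
    O = int(np.floor(np.log2(min(X, Y)))) - 2
    base = image.astype(float)
    pyramid = []
    for p in range(O):
        # Blur the group's base image at each layer scale (S + 1 layers
        # give S difference images).
        layers = [_gaussian_blur(base, sigma0 * 2 ** (q / S))
                  for q in range(S + 1)]
        # Adjacent layers differ to give the DOG images of this group.
        pyramid.append([layers[q + 1] - layers[q] for q in range(S)])
        # Next group: halve resolution by taking every other pixel.
        base = layers[S][::2, ::2]
    return pyramid
```

For a 64 x 64 input this yields O = 4 groups whose images halve in resolution from 64 x 64 down to 8 x 8, matching the group-count formula above.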
S203, the feature points in any two adjacent ground orthographic images are determined according to the Gaussian difference pyramid.
S204, the feature points in two adjacent ground orthographic images are matched to determine a matching feature point pair. S204 specifically comprises: constructing a feature description vector corresponding to each feature point according to the feature points in the ground orthographic image; determining the Euclidean distance between the feature points in one ground orthographic image and the feature points in another ground orthographic image according to the feature description vector corresponding to each feature point; and taking the feature point pair whose Euclidean distance is less than the distance threshold as the matching feature point pair, wherein the matching feature point pair comprises two feature points located in different ground orthographic images. As a specific embodiment, the specific matching process of feature points is as follows. According to the position information of a feature point, the scale value sigma of the feature point can be known, and the Gaussian image at that scale value can be obtained. The amplitude angle and the amplitude value of the pixel gradient of each point in the image are calculated in the area with radius 3 * 1.5 * sigma around the feature point. The calculation formulas are as follows:

m(x, y) = sqrt( (L(x+1, y) - L(x-1, y))^2 + (L(x, y+1) - L(x, y-1))^2 )
theta(x, y) = arctan( (L(x, y+1) - L(x, y-1)) / (L(x+1, y) - L(x-1, y)) )

After the amplitude angle and the amplitude value are calculated, the amplitude angle from 0° to 360° is divided into 36 intervals of 10° each, and the sum of the amplitude values of the points falling in each interval is counted. The direction in which the sum of the amplitude values is maximal is the main direction of the feature point.
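The main-direction computation above (36 bins of 10°, dominant bin wins) can be sketched as follows. The function name is illustrative, and for simplicity the window is a square of the given radius rather than the exact circular 3 * 1.5 * sigma neighborhood.

```python
import numpy as np

def main_orientation(L, x, y, radius):
    """Histogram the gradient orientations in a window around the
    keypoint (x, y) of Gaussian image L; return the dominant direction
    in degrees (start of the winning 10-degree bin)."""
    hist = np.zeros(36)
    for i in range(max(1, x - radius), min(L.shape[0] - 1, x + radius + 1)):
        for j in range(max(1, y - radius), min(L.shape[1] - 1, y + radius + 1)):
            dx = L[i + 1, j] - L[i - 1, j]
            dy = L[i, j + 1] - L[i, j - 1]
            m = np.hypot(dx, dy)                           # amplitude value
            theta = np.degrees(np.arctan2(dy, dx)) % 360   # amplitude angle
            hist[int(theta // 10) % 36] += m               # 36 bins of 10 deg
    return 10.0 * np.argmax(hist)
```

On a pure ramp image whose intensity increases only along the first axis, every gradient points the same way and the dominant direction is the 0° bin.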
With the feature point as the center, the position and the direction of the pixel points in the area of 15*sqrt(2)*sigma by 15*sqrt(2)*sigma near the feature point are rotated until the main direction of the feature point coincides with the x axis. After rotation, an area of 12*sigma by 12*sigma is taken from the rotated image with the feature point as the center, and the area is divided into 4 * 4 sub-regions at equal intervals, the size of each sub-region being 3*sigma. In each sub-region, 0° to 360° is divided into 8 angle intervals of 45° each. The amplitude accumulation value of each angle interval is calculated, and Gaussian weighting is carried out according to the distance from each point to the feature point, forming a 128-dimensional SIFT feature vector as the descriptor of the feature point. The Euclidean distance between two vectors is calculated from the feature vector of a feature point of one of the two images and the feature vectors of all feature points of the other image. If the ratio of the distance to the nearest feature point to the distance to the next closest feature point is less than 0.79, the feature point and the feature point with the closest distance in the other image are considered paired as a pair of feature points. By filtering the matched feature points, the pixel point positions (unit: pixel) (w_n, h_n) and (w_(n+1), h_(n+1)) of the matched feature points are obtained in a rectangular coordinate system (hereinafter referred to as the image coordinate system) with the upper left corner of the image as the origin, the positive direction of the x axis from left to right and the positive direction of the y axis from top to bottom.
S205, the position of the matching feature point pair in space is determined according to the matching feature point pair. The position (x_i, y_i, z_i) of the matching feature point pair in space is calculated as

[x_i, y_i, z_i]^T = (I^T * I)^(-1) * I^T * J

where I and J are both matrices and ^T represents the transposition of a matrix.
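The Euclidean-distance ratio matching described above (nearest distance over next-closest distance below 0.79) can be sketched with a brute-force search; the function name is illustrative and the descriptors here stand in for the 128-dimensional SIFT vectors.

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.79):
    """Match descriptor rows of desc_a against desc_b with the ratio
    test: accept only when the nearest neighbor is clearly closer than
    the second nearest. Returns (index_in_a, index_in_b) pairs."""
    pairs = []
    for i, d in enumerate(desc_a):
        dist = np.linalg.norm(desc_b - d, axis=1)  # distances to all of B
        order = np.argsort(dist)
        nearest, second = order[0], order[1]
        if dist[nearest] < ratio * dist[second]:
            pairs.append((i, int(nearest)))
    return pairs
```

An ambiguous descriptor, whose two best candidates are nearly equidistant, is rejected by the ratio test rather than matched arbitrarily.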
I and J are as follows:

I = | w_n * M31_n - M11_n                w_n * M32_n - M12_n                w_n * M33_n - M13_n                |
    | h_n * M31_n - M21_n                h_n * M32_n - M22_n                h_n * M33_n - M23_n                |
    | w_(n+1) * M31_(n+1) - M11_(n+1)    w_(n+1) * M32_(n+1) - M12_(n+1)    w_(n+1) * M33_(n+1) - M13_(n+1)    |
    | h_(n+1) * M31_(n+1) - M21_(n+1)    h_(n+1) * M32_(n+1) - M22_(n+1)    h_(n+1) * M33_(n+1) - M23_(n+1)    |

J = | M14_n - w_n * M34_n                 |
    | M24_n - h_n * M34_n                 |
    | M14_(n+1) - w_(n+1) * M34_(n+1)     |
    | M24_(n+1) - h_(n+1) * M34_(n+1)     |

where Mpq_n represents the element in row p and column q of the 3 * 4 projection matrix M_n of the n-th image (the same holds for M_(n+1)). M_n is obtained by combining the camera intrinsic matrix

| g_x   0     o_x |
| 0     g_y   o_y |
| 0     0     1   |

with the camera pose, namely the rotation matrix R given below and the camera position T_n. Here o_x and o_y respectively represent the coordinate values (unit: pixel) of the central pixel point of the image in the image coordinate system on the x axis and y axis. The parameters g_x and g_y and the matrix T_n are calculated according to the following formulas:

g_x = f / e_x,  g_y = f / e_y,  T_n = [x_n, y_n, z_n]

In the above formulas, f represents the focal length (unit: mm) of the visible light camera lens, and e_x and e_y are the lengths corresponding to each pixel on the camera photosensitive element along the x axis and y axis, with the unit of mm/pixel. According to the above formulas, the coordinate information (x_i, y_i, z_i) of the feature points in the real map coordinate system is calculated.
S206, the position of the unmanned aerial vehicle in space is determined according to the position data and attitude data of the unmanned aerial vehicle. Taking the n-th image and the (n+1)-th image as an example, based on the data acquired in S201, the three-dimensional space coordinates (x_n, y_n, z_n) at the time of taking the n-th image and the three-dimensional space coordinates (x_(n+1), y_(n+1), z_(n+1)) at the time of taking the (n+1)-th image can be obtained, with the unit of mm, in the three-dimensional space coordinate system (hereinafter referred to as the real map coordinate system) with the RTK-GPS system reference station as the origin, the x axis pointing east, the y axis pointing north and the z axis pointing vertically.
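The least-squares solution of S205 can be sketched in Python: I and J are assembled from the two 3 * 4 projection matrices exactly as in the matrices above, and numpy's least-squares solver replaces the explicit (I^T * I)^(-1) * I^T * J product. The function name and the projection matrices in the usage below are illustrative.

```python
import numpy as np

def triangulate(M_a, M_b, wa, ha, wb, hb):
    """Each image contributes two rows built from its 3x4 projection
    matrix M and the pixel (w, h) of the matched feature; solving the
    stacked system in the least-squares sense gives the 3D point."""
    rows, rhs = [], []
    for M, w, h in ((M_a, wa, ha), (M_b, wb, hb)):
        rows.append(w * M[2, :3] - M[0, :3])  # w*M3q - M1q row of I
        rhs.append(M[0, 3] - w * M[2, 3])     # M14 - w*M34 entry of J
        rows.append(h * M[2, :3] - M[1, :3])  # h*M3q - M2q row of I
        rhs.append(M[1, 3] - h * M[2, 3])     # M24 - h*M34 entry of J
    I, J = np.array(rows), np.array(rhs)
    xyz, *_ = np.linalg.lstsq(I, J, rcond=None)
    return xyz
```

With two ideal pinhole cameras one unit apart, back-projecting a known point and triangulating its two pixel positions recovers the point exactly.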
It can be obtained from the attitude and heading reference system that, at the time of taking the images, the coordinate system with the camera as the origin (hereinafter referred to as the camera coordinate system) rotates to the real map coordinate system by the angles alpha, beta and gamma (unit: °) around the x axis, y axis and z axis, forming the following rotation matrix:

R = |  cos(beta)cos(gamma)    cos(alpha)sin(gamma) + sin(alpha)sin(beta)cos(gamma)    sin(alpha)sin(gamma) - cos(alpha)sin(beta)cos(gamma) |
    | -cos(beta)sin(gamma)    cos(alpha)cos(gamma) - sin(alpha)sin(beta)sin(gamma)    sin(alpha)cos(gamma) + cos(alpha)sin(beta)sin(gamma) |
    |  sin(beta)              -sin(alpha)cos(beta)                                     cos(alpha)cos(beta)                                   |

S207, the relative height of the unmanned aerial vehicle from the canopy of farmland surface crops is determined according to the position of the unmanned aerial vehicle in space and the position of the matching feature point pair in space. The height difference in the vertical direction between the surface position where the feature points are located and the aircraft, together with the current three-dimensional space coordinates of the unmanned aerial vehicle, is interpolated to obtain the height of the unmanned aerial vehicle relative to the canopy surface of the surface crops.
S208, the flight height of the unmanned aerial vehicle is determined according to the relative height.
FIG. 4 is a structural schematic diagram of a system for determining the flight height of an unmanned aerial vehicle according to the present disclosure. As shown in FIG. 4, the system for determining the flight height of an unmanned aerial vehicle according to the present disclosure comprises a data acquiring module 401, a Gaussian difference pyramid constructing module 402, a feature point determining module 403, a matching feature point pair determining module 404, a position determining module of a matching feature point pair in space 405, a position determining module of an unmanned aerial vehicle in space 406, a relative height determining module 407 and a flight height determining module 408.
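The rotation matrix R above can be built directly from the three angles; a quick orthonormality check (R * R^T equal to the identity, determinant 1) verifies that the reconstructed matrix is a proper rotation. The function name is illustrative.

```python
import numpy as np

def rotation_matrix(alpha, beta, gamma):
    """Build the camera-to-map rotation matrix given above from the
    angles (in degrees) about the x, y and z axes."""
    a, b, g = np.radians([alpha, beta, gamma])
    sa, ca = np.sin(a), np.cos(a)
    sb, cb = np.sin(b), np.cos(b)
    sg, cg = np.sin(g), np.cos(g)
    return np.array([
        [cb * cg,  ca * sg + sa * sb * cg,  sa * sg - ca * sb * cg],
        [-cb * sg, ca * cg - sa * sb * sg,  sa * cg + ca * sb * sg],
        [sb,       -sa * cb,                ca * cb],
    ])
```

With all angles zero the result is the identity, so a camera already aligned with the real map coordinate system is left unchanged.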
The data acquiring module 401 is configured to acquire a plurality of ground orthographic images in an agricultural operation area and the position data and attitude data of an unmanned aerial vehicle at the time of acquiring the plurality of ground orthographic images. The Gaussian difference pyramid constructing module 402 is configured to construct a Gaussian difference pyramid according to pixel points in any two adjacent ground orthographic images. The feature point determining module 403 is configured to determine the feature points in any two adjacent ground orthographic images according to the Gaussian difference pyramid. The matching feature point pair determining module 404 is configured to match the feature points in two adjacent ground orthographic images to determine a matching feature point pair. The position determining module of a matching feature point pair in space 405 is configured to determine the position of the matching feature point pair in space according to the matching feature point pair. The position determining module of an unmanned aerial vehicle in space 406 is configured to determine the position of the unmanned aerial vehicle in space according to the position data and attitude data of the unmanned aerial vehicle. The relative height determining module 407 is configured to determine the relative height of the unmanned aerial vehicle from the canopy of farmland surface crops according to the position of the unmanned aerial vehicle in space and the position of the matching feature point pair in space. The flight height determining module 408 is configured to determine the flight height of the unmanned aerial vehicle according to the relative height. The system for determining the flight height of the unmanned aerial vehicle according to the present disclosure further comprises a pixel point calibrating module. 
The pixel point calibrating module is configured to calibrate pixel points in the ground orthographic image by using the formulas

x_corrected = x(1 + k1*r^2 + k2*r^4 + k3*r^6)
y_corrected = y(1 + k1*r^2 + k2*r^4 + k3*r^6)
x_corrected = x + [2*p1*x*y + p2*(r^2 + 2*x^2)]
y_corrected = y + [p1*(r^2 + 2*y^2) + 2*p2*x*y]

where k1, k2 and k3 are radial distortion factors, p1 and p2 are tangential distortion factors, x and y are the pixel point coordinates, x_corrected and y_corrected are the coordinates of the calibrated pixel point, and r is the distance from the pixel point to the image center point. The matching feature point pair determining module specifically comprises a feature description vector constructing unit, a Euclidean distance determining unit and a matching feature point pair determining unit. The feature description vector constructing unit is configured to construct a feature description vector corresponding to each feature point according to the feature points in the ground orthographic image. The Euclidean distance determining unit is configured to determine the Euclidean distance between the feature points in one ground orthographic image and the feature points in another ground orthographic image according to the feature description vector corresponding to each feature point. The matching feature point pair determining unit is configured to take the feature point pair whose Euclidean distance is less than the distance threshold as the matching feature point pair, wherein the matching feature point pair comprises two feature points located in different ground orthographic images. In this specification, each embodiment is described in a progressive manner, and each embodiment focuses on the differences from other embodiments; for identical or similar parts, reference may be made between the embodiments.
For the system disclosed in the embodiment, because it corresponds to the method disclosed in the embodiment, the description is relatively simple, and the relevant points can be found in the description of the method.

In the present disclosure, a specific example is applied to illustrate the principle and implementation of the present disclosure. The explanation of the above embodiments is only used to help understand the method of the present disclosure and its core idea; at the same time, according to the idea of the present disclosure, there will be some changes in the specific implementation and application scope for those skilled in the art. To sum up, the contents of this specification should not be construed as limiting the present disclosure.

IN THE DRAWINGS

FIG. 2

S201 A plurality of ground orthographic images in an agricultural operation area and the position data and attitude data of an unmanned aerial vehicle at the time of acquiring the plurality of ground orthographic images are acquired.

S202 A Gaussian difference pyramid is constructed according to pixel points in any two adjacent ground orthographic images.

S203 The feature points in any two adjacent ground orthographic images are determined according to the Gaussian difference pyramid.

S204 The feature points in two adjacent ground orthographic images are matched to determine a matching feature point pair.

S205 The position of the matching feature point pair in space is determined according to the matching feature point pair.

S206 The position of the unmanned aerial vehicle in space is determined according to the position data and attitude data of the unmanned aerial vehicle.

S207 The relative height of the unmanned aerial vehicle from the canopy of farmland surface crops is determined according to the position of the unmanned aerial vehicle in space and the position of the matching feature point pair in space.
S208 The flight height of the unmanned aerial vehicle is determined according to the relative height.

FIG. 4

401 data acquiring module
402 Gaussian difference pyramid constructing module
403 feature point determining module
404 matching feature point pair determining module
405 position determining module of a matching feature point pair in space
406 position determining module of an unmanned aerial vehicle in space
407 relative height determining module
408 flight height determining module
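As an illustration only (not part of the patent disclosure), the Euclidean-distance matching performed by module 404 can be sketched as follows: for each feature description vector in one image, the nearest descriptor in the adjacent image is found, and the pair is kept only when the distance falls below the threshold. The descriptor values and the threshold below are assumed toy data.

```python
import numpy as np

def match_feature_points(desc_a, desc_b, distance_threshold=0.5):
    """Match feature description vectors of two adjacent images.

    For every descriptor in image A, find the nearest descriptor in
    image B by Euclidean distance; keep the pair only when that
    distance is below the threshold.
    """
    pairs = []
    for i, da in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - da, axis=1)  # Euclidean distances
        j = int(np.argmin(dists))
        if dists[j] < distance_threshold:
            pairs.append((i, j))
    return pairs

# Two tiny descriptor sets: the first vectors nearly coincide and are
# matched; the second candidate pair is far apart and is rejected.
a = np.array([[0.0, 0.0], [5.0, 5.0]])
b = np.array([[0.1, 0.0], [9.0, 9.0]])
print(match_feature_points(a, b))  # → [(0, 0)]
```

The surviving matched pairs would then be triangulated to obtain their position in space, from which the relative height to the crop canopy follows.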
Claims (7) [1] A device for determining the flight height of an unmanned aerial vehicle, comprising an unmanned aerial vehicle, a three-axis self-stabilizing tripod head, a carrier phase difference satellite positioning system, an attitude and heading reference system, a graphics processing computer and a power supply system, wherein: a camera at the bottom of the unmanned aerial vehicle is carried on the three-axis self-stabilizing tripod head, and the three-axis self-stabilizing tripod head is used to stabilize the optical axis of the camera; the carrier phase difference satellite positioning system, the graphics processing computer and the power supply system are all mounted on the top of the unmanned aerial vehicle; the graphics processing computer is connected to the carrier phase difference satellite positioning system, the power supply system and the camera, respectively; the attitude and heading reference system is mounted on the bottom of the three-axis self-stabilizing tripod head and is connected to the graphics processing computer; and the graphics processing computer is configured to determine the relative height of the unmanned aerial vehicle from the canopy of farmland surface crops according to the position information obtained by the carrier phase difference satellite positioning system, the attitude information obtained by the attitude and heading reference system and the ground orthographic image obtained by the camera, and to determine the flight height of the unmanned aerial vehicle according to the relative height. [2] The device for determining the flight height of an unmanned aerial vehicle according to claim 1, wherein the camera is a high-resolution visible light camera. 
[3] A method for determining the flight height of an unmanned aerial vehicle, the method being applied to the device for determining the flight height of an unmanned aerial vehicle according to any one of claims 1-2, and the method comprising: acquiring a plurality of ground orthographic images in an agricultural operation area and the position data and attitude data of an unmanned aerial vehicle at the time of acquiring the plurality of ground orthographic images; constructing a Gaussian difference pyramid according to pixel points in any two adjacent ground orthographic images; determining the feature points in any two adjacent ground orthographic images according to the Gaussian difference pyramid; matching the feature points in two adjacent ground orthographic images to determine a matching feature point pair; determining the position of the matching feature point pair in space according to the matching feature point pair; determining the position of the unmanned aerial vehicle in space according to the position data and attitude data of the unmanned aerial vehicle; determining the relative height of the unmanned aerial vehicle from the canopy of farmland surface crops according to the position of the unmanned aerial vehicle in space and the position of the matching feature point pair in space; and determining the flight height of the unmanned aerial vehicle according to the relative height. 
[4] The method for determining the flight height of an unmanned aerial vehicle according to claim 3, wherein, prior to constructing a Gaussian difference pyramid according to pixel points in any two adjacent ground orthographic images, the method further comprises calibrating pixel points in the ground orthographic image by using the formulas x_corrected = x(1 + k1*r^2 + k2*r^4 + k3*r^6), y_corrected = y(1 + k1*r^2 + k2*r^4 + k3*r^6), x_corrected = x + [2*p1*x*y + p2*(r^2 + 2*x^2)] and y_corrected = y + [p1*(r^2 + 2*y^2) + 2*p2*x*y], where k1, k2 and k3 are radial distortion factors, p1 and p2 are tangential distortion factors, x and y are pixel point coordinates, x_corrected and y_corrected are the coordinates of the calibrated pixel points, and r is the distance from the pixel point to the image center point. [5] The method for determining the flight height of an unmanned aerial vehicle according to claim 3, wherein matching the feature points in two adjacent ground orthographic images to determine a matching feature point pair specifically comprises: constructing a feature description vector corresponding to each feature point according to the feature points in the ground orthographic image; determining the Euclidean distance between the feature points in one ground orthographic image and the feature points in another ground orthographic image according to the feature description vector corresponding to each feature point; and taking the feature point pair whose Euclidean distance is less than the distance threshold as the matching feature point pair, the matching feature point pair comprising two feature points located in different ground orthographic images. 
[6] A system for determining the flight height of an unmanned aerial vehicle, comprising: a data acquiring module configured to acquire a plurality of ground orthographic images in an agricultural operation area and the position data and attitude data of an unmanned aerial vehicle at the time of acquiring the plurality of ground orthographic images; a Gaussian difference pyramid constructing module configured to construct a Gaussian difference pyramid according to pixel points in any two adjacent ground orthographic images; a feature point determining module configured to determine the feature points in any two adjacent ground orthographic images according to the Gaussian difference pyramid; a matching feature point pair determining module configured to match the feature points in two adjacent ground orthographic images to determine a matching feature point pair; a position determining module of a matching feature point pair in space, configured to determine the position of the matching feature point pair in space according to the matching feature point pair; a position determining module of an unmanned aerial vehicle in space, configured to determine the position of the unmanned aerial vehicle in space according to the position data and attitude data of the unmanned aerial vehicle; a relative height determining module configured to determine the relative height of the unmanned aerial vehicle from the canopy of farmland surface crops according to the position of the unmanned aerial vehicle in space and the position of the matching feature point pair in space; and a flight height determining module configured to determine the flight height of the unmanned aerial vehicle according to the relative height. 
[7] The system for determining the flight height of an unmanned aerial vehicle according to claim 6, further comprising: a pixel point calibrating module configured to calibrate pixel points in the ground orthographic image by using the formulas x_corrected = x(1 + k1*r^2 + k2*r^4 + k3*r^6), y_corrected = y(1 + k1*r^2 + k2*r^4 + k3*r^6), x_corrected = x + [2*p1*x*y + p2*(r^2 + 2*x^2)] and y_corrected = y + [p1*(r^2 + 2*y^2) + 2*p2*x*y], where k1, k2 and k3 are radial distortion factors, p1 and p2 are tangential distortion factors, x and y are pixel point coordinates, x_corrected and y_corrected are the coordinates of the calibrated pixel points, and r is the distance from the pixel point to the image center point. The system for determining the flight height of an unmanned aerial vehicle according to claim 6, wherein the matching feature point pair determining module specifically comprises: a feature description vector constructing unit configured to construct a feature description vector corresponding to each feature point according to the feature points in the ground orthographic image; a Euclidean distance determining unit configured to determine the Euclidean distance between the feature points in one ground orthographic image and the feature points in another ground orthographic image according to the feature description vector corresponding to each feature point; and a matching feature point pair determining unit configured to take the feature point pair whose Euclidean distance is less than the distance threshold as the matching feature point pair, the matching feature point pair comprising two feature points located in different ground orthographic images. -0-0-0-0-0-0-0-0-
Similar technologies:
Publication No. | Publication date | Patent title
CA2534968C | 2013-06-18 | Vehicle based data collection and processing system
Samad et al. | 2013 | The potential of Unmanned Aerial Vehicle for civilian and mapping application
US20160313435A1 | 2016-10-27 | Self-calibrated, remote imaging and data processing system
US7725258B2 | 2010-05-25 | Vehicle based data collection and processing system and imaging sensor system and methods thereof
US8994822B2 | 2015-03-31 | Infrastructure mapping system and method
CN103557841B | 2016-08-10 | A method of improving the photogrammetric accuracy of multi-camera composite images
EP2201326B1 | 2013-07-31 | Method for determining distance
WO2004027348A2 | 2004-04-01 | A method of using a self-locking travel pattern to achieve calibration of remote sensors using conventionally collected data
CN109597095A | 2019-04-09 | Backpack type 3D laser scanning and three-dimensional imaging combined system and data capture method
JP6282275B2 | 2018-03-07 | Infrastructure mapping system and method
CN110244282B | 2021-06-15 | Multi-camera system and laser radar combined system and combined calibration method thereof
CN107146256B | 2019-07-05 | Camera calibration method under outfield large viewing field condition based on differential global positioning system
CN109461190A | 2019-03-12 | Measurement data processing device and measurement data processing method
CA2796162A1 | 2012-10-04 | Self-calibrated, remote imaging and data processing system
Madawalagama et al. | 2016 | Low cost aerial mapping with consumer-grade drones
CN108896957A | 2018-11-27 | Positioning system and method for an unmanned aerial vehicle control signal source
Hill et al. | 2017 | Ground-to-air flow visualization using Solar Calcium-K line Background-Oriented Schlieren
Bybee et al. | 2019 | Method for 3-D scene reconstruction using fused LiDAR and imagery from a texel camera
NL2027359B1 | 2021-11-09 | Device, method and system for determining flight height of unmanned aerial vehicle
Guntel et al. | 2018 | Accuracy analysis of control point distribution for different terrain types on photogrammetric block
Olawale et al. | 2015 | A Four-Step Ortho-Rectification Procedure for Geo-Referencing Video Streams from a Low-Cost UAV
CN210592433U | 2020-05-22 | Multi-rotor unmanned aerial vehicle image-control-point-free three-dimensional modeling and mapping device
Baron et al. | 2003 | ICC experiences on Inertial/GPS sensor orientation
Nekmat et al. | 2018 | Assessment of Generated DTM Model Using UAV Sensors Toward Earthwork Calculation
Xu et al. | 2018 | Geo-location for Ground Target with Multiple Observations Using Unmanned Aerial Vehicle
Family patents:
Publication No. | Publication date
US20220044434A1 | 2022-02-10
NL2027359B1 | 2021-11-09
CN111932622A | 2020-11-13
Cited references:
Publication No. | Filing date | Publication date | Applicant | Patent title
US20190206073A1 | 2016-11-24 | 2019-07-04 | Tencent Technology Company Limited | Aircraft information acquisition method, apparatus and device
Legal status:
Priority:
Application No. | Filing date | Patent title
CN202010794874.4A | CN111932622A | 2020-08-10 | 2020-08-10 | Device, method and system for determining flying height of unmanned aerial vehicle